Learning Analytics by Johann Ari Larusson & Brandon White

Publisher: Springer New York, New York, NY


To develop the tools necessary to make pedagogical changes, current research indicates that models ought to be developed around “informative feedback [because it] is more effective in teaching desirable outcomes, and is perceived as more valuable by learners” (Tanes et al. 2011, p. 2415). As an example of an informative feedback system currently in use, Purdue University’s Signals operates in conjunction with the LMS to “provide both performance and outcome oriented feedback to students” (Tanes et al. 2011, p. 2415). This means that while instructors specify the parameters of performance for student feedback, the same data is used to assess whether the pedagogy being employed is effective for large groups of students. If an instructor sees many “red lights” in the Signals system (indicating students in need of immediate performance alteration), that instructor should recognize that a pedagogical shift may be necessary for students to achieve greater success. Shifts grounded in such “informed change” work because analytics “provide evidence on which to form understanding and make informed (rather than instinctive) decisions” (van Harmelen and Workman 2012, p. 17). The benefit of learning analytics for instructors is the production and promulgation of hard data that allows alterations in teaching method to be made relatively quickly.
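The traffic-light logic described above can be illustrated with a minimal sketch. The thresholds, feature names, and classification rule here are assumptions for illustration only, not Purdue's actual Signals algorithm:

```python
# Hypothetical sketch of a Signals-style traffic-light rule.
# Thresholds and inputs (grade percentage, LMS logins per week)
# are illustrative assumptions, not the real Signals model.

def signal_light(grade_pct, lms_logins_per_week,
                 risk_grade=60.0, warn_grade=75.0, min_logins=3):
    """Classify a student into a red/yellow/green feedback band."""
    if grade_pct < risk_grade:
        return "red"      # immediate performance alteration needed
    if grade_pct < warn_grade or lms_logins_per_week < min_logins:
        return "yellow"   # at risk; monitor and encourage
    return "green"        # on track

def red_fraction(students):
    """Instructor-level view: the share of 'red' students in a course.

    A high fraction suggests the problem may be pedagogical rather
    than individual, as the passage above argues.
    """
    lights = [signal_light(grade, logins) for grade, logins in students]
    return lights.count("red") / len(lights)
```

For example, a course where half the roster falls below the risk threshold would return a red fraction of 0.5, signaling that a course-level change, not just individual intervention, may be warranted.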

As a formalized system of research, learning analytics is relatively new. Dawson et al. (2009) acknowledge the “scarcity of resources available that can readily assist teachers in rapidly evaluating learning progress and behavior in order to better design learning activities to provide a more personalized and relevant learning environment” (p. 191). As more historical student performance data becomes available to researchers, better algorithms likely will be developed. Recent work in causal models has “identified links between certain measurable data attributes describing past student behavior and the performance of a student,” but this, too, “is dependent on a body of historical data” (van Harmelen and Workman 2012, p. 17).
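The idea of fitting a link between a past-behavior attribute and student performance from a body of historical data can be sketched as a simple least-squares fit. The attribute (forum posts) and the cohort values below are invented for illustration; real models would use many attributes and far larger histories:

```python
# Illustrative sketch: linking one past-behavior attribute
# (forum posts, an assumed example) to final grade via ordinary
# least-squares over a (tiny, invented) historical cohort.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Historical records: (forum posts, final grade %) -- invented data.
history = [(2, 55), (5, 62), (8, 71), (12, 80), (15, 88)]
slope, intercept = fit_line(*zip(*history))

def predict_grade(posts):
    """Predict a current student's grade from the fitted historical link."""
    return slope * posts + intercept
```

The point of the sketch is the dependency the quote names: without the `history` list, there is nothing to fit, which is why such models are only as good as the body of historical data behind them.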

Preliminary results of measuring how learning analytics provide actionable data to instructors indicate that “student success was associated with instructional rather than motivational feedback, and type of rather than frequency of summative and formative feedback” (Tanes et al. 2011, p. 2420). The challenge of using analytical data for these purposes is the sheer amount of “comprehensive” data needed to make the case (Ali et al. 2012, p. 470). Multiple data points help surface statistically significant patterns, which in turn refine the algorithms behind feedback tools that shape pedagogy; the problems in the interim, though, are the volume of data needed to compute such multivariate algorithms and the lack of consensus on which data points are most useful.

As the ongoing work of learning analytics is used to help improve pedagogical practices, one of the important caveats to the research is ensuring that the data employed by instructors does not discourage students. Greller and Drachsler (2012) are quite emphatic on this point because they see that statistical modeling may box in “individual teachers or learners against a statistical norm” with the possible result of “strongly stifl[ing] innovation, individuality, creativity, and experimentation that are so important in driving learning and teaching developments…” (p.
